
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1. Self-Supervised Curriculum Learning for Spelling Error Correction ...
   Source: BASE
2. Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation ...
   Read paper: https://www.aclanthology.org/2021.acl-long.23
   Abstract: One of the reasons Transformer translation models are popular is that self-attention networks for context modelling can be easily parallelized at sequence level. However, the computational complexity of a self-attention network is $O(n^2)$, increasing quadratically with sequence length. By contrast, the complexity of LSTM-based approaches is only $O(n)$. In practice, however, LSTMs are much slower to train than self-attention networks, as they cannot be parallelized at sequence level: to model context, the current LSTM state relies on the full LSTM computation of the preceding state, which has to be computed n times for a sequence of length n. The linear transformations involved in the LSTM gate and state computations are the major cost factors in this. To enable sequence-level parallelization of LSTMs, we approximate full LSTM context modelling by computing hidden states and gates with the current input and a simple bag-of-words representation ...
   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
   URL: https://dx.doi.org/10.48448/fcc7-e373
   https://underline.io/lecture/25374-multi-head-highly-parallelized-lstm-decoder-for-neural-machine-translation
   Source: BASE
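The abstract above only sketches the core idea, so here is a minimal Python (NumPy) illustration of one plausible reading of it: gates and hidden states are computed from the current input together with a bag-of-words summary of the preceding inputs, so the expensive linear transformations can be batched over the whole sequence and only a cheap element-wise recurrence remains sequential. The function name, the running-mean form of the bag-of-words representation, the parameter layout, and the retained element-wise cell recurrence are all illustrative assumptions, not the authors' implementation (see the ACL paper linked above for the actual model).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bow_parallel_lstm_layer(X, params):
    """Sequence-level-parallel LSTM approximation (illustrative sketch).

    Gates and candidate states are computed from the current input x_t and a
    bag-of-words (running mean) summary of the preceding inputs instead of the
    exact previous hidden state, so the large linear transformations can be
    applied to the whole sequence at once. Parameter layout is hypothetical.

    X: (seq_len, d_in) input sequence; returns (seq_len, d_hid) hidden states.
    """
    seq_len, d_in = X.shape

    # Bag-of-words context: running mean of all strictly preceding positions
    # (zeros at t = 0). This replaces the dependence on h_{t-1}.
    cumsum = np.cumsum(X, axis=0)
    counts = np.arange(1, seq_len + 1)[:, None]
    bow = np.vstack([np.zeros((1, d_in)), (cumsum / counts)[:-1]])

    # All gate pre-activations depend only on x_t and bow_t, so the costly
    # matrix multiplications run over the whole sequence in one go.
    Z = np.concatenate([X, bow], axis=-1)            # (seq_len, 2 * d_in)
    i = sigmoid(Z @ params["W_i"] + params["b_i"])   # input gate
    f = sigmoid(Z @ params["W_f"] + params["b_f"])   # forget gate
    o = sigmoid(Z @ params["W_o"] + params["b_o"])   # output gate
    g = np.tanh(Z @ params["W_g"] + params["b_g"])   # candidate update

    # Only a cheap element-wise recurrence over the cell state stays sequential
    # (an assumption here; the paper may organize this step differently).
    c = np.zeros_like(g[0])
    H = np.empty_like(g)
    for t in range(seq_len):
        c = f[t] * c + i[t] * g[t]
        H[t] = o[t] * np.tanh(c)
    return H

# Minimal usage example with random weights (shapes only, not trained values).
rng = np.random.default_rng(0)
d_in, d_hid, seq_len = 8, 16, 5
params = {f"W_{k}": 0.1 * rng.standard_normal((2 * d_in, d_hid)) for k in "ifog"}
params.update({f"b_{k}": np.zeros(d_hid) for k in "ifog"})
print(bow_parallel_lstm_layer(rng.standard_normal((seq_len, d_in)), params).shape)  # (5, 16)
```

The point of the approximation is visible in the sketch: the four large matrix products are applied once to the full (seq_len, 2 * d_in) input matrix, while the remaining per-step loop contains only element-wise operations, which is where the sequence-level parallelization claimed in the abstract comes from.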
3. Modeling Task-Aware MIMO Cardinality for Efficient Multilingual Neural Machine Translation ...
   Source: BASE
4. Transformer-based NMT: modeling, training and implementation
   Xu, Hongfei. Saarländische Universitäts- und Landesbibliothek, 2021
   Source: BASE
5. Probing Word Translations in the Transformer and Trading Decoder for Encoder Layers ...
   Source: BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 5